Maximum A Posteriori Maximum Entropy Order Determination - IEEE Transactions on Signal Processing

Author

  • Luc Knockaert
Abstract

An instance crucial to most problems in signal processing is the selection of the order of a presupposed model. Examples are the determination of the putative number of signals present in white Gaussian noise or the number of noise-contaminated sources impinging on a passive sensor array. It is shown that maximum a posteriori Bayesian arguments, coupled with maximum entropy considerations, offer an operational and consistent model order selection scheme, competitive with the minimum description length criterion.
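The paper's own MAP/maximum-entropy criterion is not spelled out in this abstract; as a point of reference for the comparison it mentions, the minimum description length (MDL) criterion for the number of signals in white Gaussian noise is commonly written in terms of the eigenvalues of the sample covariance matrix. The sketch below illustrates only that benchmark; the function name mdl_num_signals and the toy data are illustrative assumptions, not taken from the paper.

import numpy as np

def mdl_num_signals(X):
    """Estimate the number of signals in white Gaussian noise using the
    commonly quoted eigenvalue form of the MDL criterion (the benchmark
    mentioned in the abstract). X is an (N, p) array of N snapshots from
    a p-element sensor array; the return value is the order estimate k."""
    N, p = X.shape
    R = (X.conj().T @ X) / N                       # sample covariance matrix
    lam = np.sort(np.linalg.eigvalsh(R))[::-1]     # eigenvalues, descending
    costs = []
    for k in range(p):
        tail = lam[k:]                             # the p - k smallest eigenvalues
        geo = np.exp(np.mean(np.log(tail)))        # geometric mean
        ari = np.mean(tail)                        # arithmetic mean
        loglik = -N * (p - k) * np.log(geo / ari)  # log-likelihood term
        penalty = 0.5 * k * (2 * p - k) * np.log(N)
        costs.append(loglik + penalty)
    return int(np.argmin(costs))

# Toy usage: two sources observed by a 6-element array in white noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2))
S = rng.standard_normal((500, 2))
X = S @ A.T + 0.3 * rng.standard_normal((500, 6))
print(mdl_num_signals(X))   # expected to report 2 at this noise level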


Similar articles


Thresholding using two-dimensional histogram and fuzzy entropy principle

This paper presents a thresholding approach by performing fuzzy partition on a two-dimensional (2-D) histogram based on fuzzy relation and the maximum fuzzy entropy principle. The experiments with various gray level and color images have demonstrated that the proposed approach outperforms the 2-D nonfuzzy approach and the one-dimensional (1-D) fuzzy partition approach.

Full text

Image Bit-Depth Enhancement via Maximum A Posteriori Estimation of AC Signal

When images at low bit-depth are rendered on high bit-depth displays, missing least significant bits need to be estimated. We study the image bit-depth enhancement problem: estimating an original image from its quantized version from a minimum mean squared error (MMSE) perspective. We first argue that a graph-signal smoothness prior, one defined on a graph embedding the image structure, is an app...

Full text

Maximum Independence and Mutual Information - IEEE Transactions on Information Theory

If I1, I2, ..., Ik are random Boolean variables and the joint probabilities up to the (k-1)-st order are known, the values of the k-th order probabilities maximizing the overall entropy have been defined as the maximum independence estimate. In the paper, some contributions deriving from the definition of maximum independence probabilities are proposed. First, it is shown that the maximum independence values are ...

Full text
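The entry above defines the maximum independence estimate only in words; a compact restatement in the usual maximum-entropy form (notation mine, a sketch rather than that paper's exact statement): given Boolean variables I_1, ..., I_k whose joint probabilities up to order k-1 are fixed, the k-th order distribution is chosen as

\hat{p} \;=\; \arg\max_{p} H(p)
        \;=\; \arg\max_{p}\Big(-\sum_{i_1,\dots,i_k\in\{0,1\}} p(i_1,\dots,i_k)\,\log p(i_1,\dots,i_k)\Big),

subject to p reproducing all the known marginals of order at most k-1.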

Mean and variance of implicitly defined biased estimators (such as penalized maximum likelihood): applications to tomography

Many estimators in signal processing problems are defined implicitly as the maximum of some objective function. Examples of implicitly defined estimators include maximum likelihood, penalized likelihood, maximum a posteriori, and nonlinear least squares estimation. For such estimators, exact analytical expressions for the mean and variance are usually unavailable. Therefore, investigators usual...

Full text
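For the last entry above, the standard first-order (implicit function theorem / delta method) treatment of an implicitly defined estimator gives a useful picture of what "mean and variance" refers to; the expressions below are a sketch of that generic approximation, not necessarily the exact form derived in the cited paper. For an estimator \hat{\theta}(Y) = \arg\max_{\theta} \Phi(\theta, Y), linearizing the stationarity condition \nabla_{\theta}\Phi(\hat{\theta}, Y) = 0 around \bar{Y} = E\{Y\} yields

E\{\hat{\theta}(Y)\} \approx \hat{\theta}(\bar{Y}), \qquad
\mathrm{Cov}\{\hat{\theta}(Y)\} \approx
\big[\nabla^{20}\Phi\big]^{-1}\,\nabla^{11}\Phi\;\mathrm{Cov}\{Y\}\;\big(\nabla^{11}\Phi\big)^{T}\,\big[\nabla^{20}\Phi\big]^{-T},

where \nabla^{20}\Phi is the Hessian of \Phi with respect to \theta and \nabla^{11}\Phi is the matrix of mixed partial derivatives with respect to \theta and Y, both evaluated at (\hat{\theta}(\bar{Y}), \bar{Y}).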


Journal: IEEE Transactions on Signal Processing

Volume   Issue

Pages  -

Publication date: 2004